A Riemannian Network for SPD Matrix Learning

Authors

  • Zhiwu Huang
  • Luc Van Gool
Abstract

Symmetric Positive Definite (SPD) matrix learning methods have become popular in many image and video processing tasks, thanks to their ability to learn appropriate statistical representations while respecting the Riemannian geometry of the underlying SPD manifold. In this paper we build a Riemannian network to open up a new direction of SPD matrix non-linear learning in a deep architecture. The built network generalizes the Euclidean network paradigm to non-Euclidean SPD manifolds. In particular, we devise bilinear mapping layers to transform input SPD matrices into more desirable SPD matrices, exploit eigenvalue rectification layers to introduce non-linearity on the new SPD matrices with a non-linear rectifying function, and design eigenvalue logarithm layers to perform Log-Euclidean Riemannian computing on the resulting SPD matrices for the regular output layers. To train the deep network, we propose a Riemannian matrix backpropagation that exploits a variant of stochastic gradient descent on the Stiefel manifolds on which the network weights reside. We show through experiments that the proposed SPD network can be trained simply and outperforms existing SPD matrix learning and state-of-the-art methods on three typical visual classification tasks.
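The layer descriptions above correspond to simple matrix operations. Below is a minimal NumPy sketch of the three layer types and of one projection-plus-QR-retraction step of SGD on the Stiefel manifold; the function names, the rectification threshold eps, and the particular retraction are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def bimap(X, W):
    """BiMap layer: X_k = W^T X_{k-1} W.
    With W a column-orthonormal (Stiefel) matrix, the bilinear mapping of an
    SPD input stays symmetric positive definite."""
    return W.T @ X @ W

def reeig(X, eps=1e-4):
    """ReEig layer: rectify eigenvalues below eps (a ReLU-like non-linearity)."""
    vals, vecs = np.linalg.eigh(X)
    return vecs @ np.diag(np.maximum(vals, eps)) @ vecs.T

def logeig(X):
    """LogEig layer: matrix logarithm, i.e. the Log-Euclidean map that flattens
    the SPD manifold so that regular output layers can follow."""
    vals, vecs = np.linalg.eigh(X)
    return vecs @ np.diag(np.log(vals)) @ vecs.T

def stiefel_sgd_step(W, G, lr=1e-2):
    """One sketched SGD step on the Stiefel manifold: project the Euclidean
    gradient G onto the tangent space at W, take a step, retract with QR."""
    tangent = G - W @ (0.5 * (W.T @ G + G.T @ W))   # tangent-space projection
    Q, R = np.linalg.qr(W - lr * tangent)
    return Q * np.sign(np.diag(R))                  # keep R's diagonal positive

# Toy forward pass: a 20x20 SPD input reduced to a 10x10 SPD feature.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20))
X = A @ A.T + 1e-3 * np.eye(20)                      # random SPD input
W, _ = np.linalg.qr(rng.standard_normal((20, 10)))   # Stiefel weight matrix
feature = logeig(reeig(bimap(X, W)))                 # input to a classifier
```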


Similar articles

A Geometry Preserving Kernel over Riemannian Manifolds

The kernel trick and projection to tangent spaces are two choices for linearizing data points that lie on Riemannian manifolds. These approaches provide the prerequisites for applying standard machine learning methods on Riemannian manifolds. Classical kernels implicitly project data to a high-dimensional feature space without considering the intrinsic geometry of the data points. ...
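One widely used way to build a kernel that respects the manifold geometry (not necessarily the construction proposed in this paper) is a Gaussian kernel on the Log-Euclidean distance between SPD matrices; the sketch below assumes that metric and a user-chosen bandwidth sigma.

```python
import numpy as np

def log_spd(X):
    """Matrix logarithm of an SPD matrix via its eigendecomposition."""
    vals, vecs = np.linalg.eigh(X)
    return vecs @ np.diag(np.log(vals)) @ vecs.T

def log_euclidean_kernel(X, Y, sigma=1.0):
    """Gaussian kernel on the Log-Euclidean distance between SPD matrices:
    k(X, Y) = exp(-||log(X) - log(Y)||_F^2 / (2 * sigma^2)).
    The distance is computed after mapping both points to the flat log
    domain, rather than on the raw matrices."""
    d2 = np.linalg.norm(log_spd(X) - log_spd(Y), ord="fro") ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))
```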

Geometry-aware Similarity Learning on SPD Manifolds for Visual Recognition

Symmetric Positive Definite (SPD) matrices have been widely used for data representation in many visual recognition tasks. Their success is mainly attributed to learning discriminative SPD matrices that encode the Riemannian geometry of the underlying SPD manifold. In this paper, we propose a geometry-aware SPD similarity learning (SPDSL) framework to learn discriminative SPD features by directly ...
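As a rough illustration of geometry-aware similarity between SPD matrices (using the Log-Euclidean metric rather than whatever metric SPDSL actually adopts), one can map both matrices through a shared column-orthonormal transformation and compare them in the log domain; the names below are hypothetical.

```python
import numpy as np

def log_spd(X):
    vals, vecs = np.linalg.eigh(X)
    return vecs @ np.diag(np.log(vals)) @ vecs.T

def spd_similarity(X, Y, W):
    """Geometry-aware similarity sketch: transform both SPD matrices with a
    shared Stiefel matrix W, then compare them with the Log-Euclidean
    distance instead of a plain Euclidean one."""
    d = np.linalg.norm(log_spd(W.T @ X @ W) - log_spd(W.T @ Y @ W), ord="fro")
    return np.exp(-d ** 2)
```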

DeepKSPD: Learning Kernel-matrix-based SPD Representation for Fine-grained Image Recognition

Being symmetric positive definite (SPD), the covariance matrix has traditionally been used to represent a set of local descriptors in visual recognition. Recent studies show that a kernel matrix can give a considerably better representation by modelling the non-linearity in the local descriptor set. Nevertheless, neither the descriptors nor the kernel matrix is deeply learned. Worse, they are considered ...
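The kernel-matrix representation mentioned above can be illustrated with a generic sketch: in place of the d x d covariance of a descriptor set, build a d x d RBF kernel matrix among the feature channels (each viewed as a vector over the local descriptors), plus a small ridge to keep it strictly SPD. Whether the kernel is taken over channels or over descriptors varies between methods; the construction, names, and bandwidth below are assumptions, not DeepKSPD's exact formulation.

```python
import numpy as np

def kernel_spd_representation(F, gamma=0.1, ridge=1e-6):
    """Kernel-matrix SPD representation of a local-descriptor set (sketch).

    F: (n, d) array of n local descriptors with d channels. Returns a d x d
    RBF kernel matrix among the channels, capturing non-linear channel
    interactions that a plain covariance misses."""
    C = F.T                                          # (d, n): one row per channel
    sq = np.sum(C ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * C @ C.T   # pairwise squared distances
    K = np.exp(-gamma * d2)
    return K + ridge * np.eye(K.shape[0])            # ridge keeps it SPD
```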

Deep manifold-to-manifold transforming network for action recognition

In this paper, a novel deep manifold-to-manifold transforming network (DMT-Net) is proposed for action recognition, in which a symmetric positive definite (SPD) matrix is adopted to describe the spatio-temporal information of action feature vectors. Since each SPD matrix is a point on a Riemannian manifold, the proposed DMT-Net aims to learn more discriminative features by hierarchically ...
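A common way to obtain the SPD descriptor referred to above is a regularized covariance matrix of the per-frame feature vectors of an action clip; the sketch below is this generic construction, not DMT-Net's specific one.

```python
import numpy as np

def covariance_spd_descriptor(F, ridge=1e-4):
    """Regularized covariance descriptor of an action clip (sketch).

    F: (T, d) array of d-dimensional feature vectors, one per frame.
    The returned d x d covariance (plus a small ridge) is symmetric positive
    definite, i.e. a point on the SPD manifold that a manifold-to-manifold
    network can take as input."""
    mu = F.mean(axis=0, keepdims=True)
    X = F - mu
    cov = X.T @ X / max(F.shape[0] - 1, 1)
    return cov + ridge * np.eye(F.shape[1])
```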

Journal title:

Volume   Issue

Pages  -

Publication date: 2017